
    Global analysis of parallel analog networks with retarded feedback

    We analyze the retrieval dynamics of analog “neural” networks with clocked sigmoid elements and multiple signal delays. Proving a conjecture by Marcus and Westervelt, we show that for delay-independent symmetric coupling strengths, the only attractors are fixed points and periodic limit cycles. The same result applies to a larger class of asymmetric networks that may be utilized to store temporal associations with a cyclic structure. We discuss implications for various learning schemes in the space-time domain.
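
    As a rough illustration of the network class described above (not the paper's own analysis), the sketch below iterates a clocked tanh network in which every unit reads the other units' outputs after a uniform delay of D time steps through a symmetric coupling matrix; network size, delay, and coupling statistics are arbitrary assumptions. According to the result stated in the abstract, such a trajectory should settle onto a fixed point or a periodic limit cycle.

```python
# Minimal simulation sketch (assumed parameters, not the paper's code): a
# clocked network of tanh units with delay-independent symmetric couplings
# and a uniform signal delay of D time steps.
import numpy as np

rng = np.random.default_rng(0)
N, D, T = 8, 3, 2000                        # units, delay (steps), iterations
J = rng.normal(size=(N, N))
J = (J + J.T) / 2                           # symmetric coupling strengths
np.fill_diagonal(J, 0.0)

g = np.tanh                                 # sigmoid transfer function
x = rng.uniform(-1, 1, size=(D, N))         # history buffer; x[0] is the D-step-old state

states = []
for _ in range(T):
    x_new = g(J @ x[0])                     # each unit reads the D-step-delayed outputs
    x = np.vstack([x[1:], [x_new]])
    states.append(x_new)

# According to the result above, the trajectory should settle onto a fixed
# point or a periodic limit cycle; inspect the late-time variability per unit.
tail = np.array(states[-200:])
print("late-time range per unit:", np.ptp(tail, axis=0).round(3))
```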

    Input-driven components of spike-frequency adaptation can be unmasked in vivo

    Spike-frequency adaptation affects the response characteristics of many sensory neurons, and different biophysical processes contribute to this phenomenon. Many cellular mechanisms underlying adaptation are triggered by the spike output of the neuron in a feedback manner (e.g., specific potassium currents that are primarily activated by the spiking activity). In contrast, other components of adaptation may be caused, in a feedforward way, by the sensory or synaptic input that the neuron receives. Examples include viscoelasticity of mechanoreceptors, transducer adaptation in hair cells, and short-term synaptic depression. For a functional characterization of spike-frequency adaptation, it is essential to understand the dependence of adaptation on the input and output of the neuron. Here, we demonstrate how an input-driven component of adaptation can be uncovered in vivo from recordings of spike trains in an insect auditory receptor neuron, even if the total adaptation is dominated by output-driven components. Our method is based on the identification of different inputs that yield the same output and on sudden switches between these inputs. In particular, we determined for different sound frequencies those intensities that are required to yield a predefined steady-state firing rate of the neuron. We then found that switching between these sound frequencies causes transient deviations of the firing rate. These firing-rate deflections are evidence of input-driven adaptation and can be used to quantify how this adaptation component affects the neural activity. Based on previous knowledge of the processes in auditory transduction, we conclude that for the investigated auditory receptor neurons, this adaptation phenomenon is of mechanical origin.
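
    The logic of the switching protocol can be sketched with a toy rate model (purely illustrative; the parameters and model form are assumptions, not the published analysis). Two stimuli are constructed to yield the same steady-state rate, but only one of them engages an input-driven adaptation variable; switching between them produces a transient rate deflection that an output-driven mechanism alone could not generate.

```python
# Toy rate model (all parameters and model form are assumptions) illustrating
# how switching between two stimuli with identical steady-state responses
# unmasks an input-driven adaptation component a_in alongside an
# output-driven one a_out.
import numpy as np

dt, T_half = 1e-3, 2.0                  # time step and duration per stimulus (s)
tau_in, tau_out = 0.08, 0.08            # adaptation time constants (s)
k_in, k_out = 0.6, 0.5                  # strengths of the two adaptation components

d_A, p_A = 1.0, 0.0                     # stimulus A: drive, input-adaptation target
d_B, p_B = d_A + k_in, 1.0              # stimulus B: built to give the same steady-state rate

a_in = a_out = 0.0
rates = []
for i in range(int(2 * T_half / dt)):
    d, p = (d_A, p_A) if i * dt < T_half else (d_B, p_B)
    a_in += dt / tau_in * (p - a_in)                 # feedforward: driven by the stimulus
    r = max(d - k_in * a_in - k_out * a_out, 0.0)    # instantaneous firing rate
    a_out += dt / tau_out * (r - a_out)              # feedback: driven by the output rate
    rates.append(r)

rates = np.array(rates)
i_switch = int(T_half / dt)
print("steady rate before switch  :", round(rates[i_switch - 1], 3))
print("peak transient after switch:", round(rates[i_switch:].max(), 3))
print("steady rate after switch   :", round(rates[-1], 3))
```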

    A universal model for spike-frequency adaptation

    Spike-frequency adaptation is a prominent feature of neural dynamics. Among other mechanisms, various ionic currents modulating spike generation cause this type of neural adaptation. Prominent examples are voltage-gated potassium currents (M-type currents), the interplay of calcium currents and intracellular calcium dynamics with calcium-gated potassium channels (AHP-type currents), and the slow recovery from inactivation of the fast sodium current. While recent modeling studies have focused on the effects of specific adaptation currents, we derive a universal model for the firing-frequency dynamics of an adapting neuron that is independent of the specific adaptation process and spike generator. The model is completely defined by the neuron's onset f-I curve, the steady-state f-I curve, and the time constant of adaptation. For a specific neuron, these parameters can be easily determined from electrophysiological measurements without any pharmacological manipulations. At the same time, the simplicity of the model allows one to analyze mathematically how adaptation influences signal processing on the single-neuron level. In particular, we elucidate the specific nature of high-pass filter properties caused by spike-frequency adaptation. The model is limited to firing frequencies higher than the reciprocal adaptation time constant and to moderate fluctuations of the adaptation and the input current. As an extension of the model, we introduce a framework for combining an arbitrary spike generator with a generalized adaptation current.
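
    One possible realization of such a model is a subtractive adaptation variable combined with the two measured f-I curves. The sketch below assumes threshold-linear onset and steady-state f-I curves and first-order adaptation dynamics; these choices are illustrative assumptions, not the paper's derivation.

```python
# Sketch of a subtractive adaptation model specified by an onset f-I curve f0,
# a steady-state f-I curve finf, and an adaptation time constant tau.  The
# threshold-linear curves and all numbers are illustrative assumptions.
I0, g_on, g_ss, tau = 0.2, 100.0, 40.0, 0.1        # rheobase, gains (Hz per unit input), tau (s)

f0 = lambda I: g_on * max(I - I0, 0.0)             # onset f-I curve (Hz)
finf = lambda I: g_ss * max(I - I0, 0.0)           # steady-state f-I curve (Hz)
f0_inv = lambda f: f / g_on + I0                   # inverse of the onset curve (for f > 0)

def A_inf(I):
    """Adaptation level at which the onset curve reproduces the steady-state rate."""
    return I - f0_inv(finf(I)) if I > I0 else 0.0

dt, T, I_step = 1e-3, 1.0, 1.0                     # step stimulus switched on at t = 0
A, rates = 0.0, []
for _ in range(int(T / dt)):
    rates.append(f0(I_step - A))                   # firing rate: onset curve shifted by A
    A += dt / tau * (A_inf(I_step) - A)            # first-order adaptation dynamics

print("onset rate  :", round(rates[0], 1), "Hz")
print("adapted rate:", round(rates[-1], 1), "Hz; steady-state curve gives",
      round(finf(I_step), 1), "Hz")
```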

    Traveling waves of excitation in neural field models

    Field models provide an elegant mathematical framework to analyze large-scale patterns of neural activity. On the microscopic level, these models are usually based on either a firing-rate picture or integrate-and-fire dynamics. This article shows that in spite of the large conceptual differences between the two types of dynamics, both generate closely related plane-wave solutions. Furthermore, for a large group of models, estimates of the network connectivity derived from the speed of these plane waves only marginally depend on the assumed class of microscopic dynamics. We derive quantitative results about this phenomenon and discuss consequences for the interpretation of experimental data.
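
    A minimal firing-rate version of such a field model can be simulated directly. The sketch below assumes an exponential connectivity kernel and a Heaviside rate function (both arbitrary choices, not taken from the article) and measures the speed of the resulting activity front numerically.

```python
# Minimal 1-D firing-rate field,  tau * du/dt = -u + (w * H(u - theta)),
# with an exponential connectivity kernel w.  All parameters are assumptions;
# the point is only that a front forms and its speed can be read off.
import numpy as np

L, N = 100.0, 2000                       # domain length, grid points (arbitrary units)
dx = L / N
x = np.arange(N) * dx
tau, theta = 1.0, 0.3                    # time constant, firing threshold
sigma, W_total = 2.0, 1.0                # kernel width and total synaptic weight

w = np.exp(-np.abs(x - L / 2) / sigma)   # exponential connectivity footprint
w *= W_total / (w.sum() * dx)            # normalize the kernel's integral to W_total

u = np.where(x < 10.0, 1.0, 0.0)         # seed activity on the left end

def front_position(u):
    above = np.where(u > theta)[0]
    return x[above[-1]] if above.size else 0.0

dt, steps = 0.01, 2000
p_mid = 0.0
for n in range(steps):
    fire = (u > theta).astype(float)                 # Heaviside rate function
    inp = dx * np.convolve(fire, w, mode="same")     # spatial summation of input
    u += dt / tau * (-u + inp)                       # field dynamics
    if n == steps // 2:
        p_mid = front_position(u)

speed = (front_position(u) - p_mid) / (dt * (steps - steps // 2))
print(f"estimated front speed: {speed:.2f} space units per unit time")
```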

    The iso-response method

    Throughout the nervous system, neurons integrate high-dimensional input streams and transform them into an output of their own. This integration of incoming signals involves filtering processes and complex non-linear operations. The shapes of these filters and non-linearities determine the computational features of single neurons and their functional roles within larger networks. A detailed characterization of signal integration is thus a central ingredient to understanding information processing in neural circuits. Conventional methods for measuring single-neuron response properties, such as reverse correlation, however, are often limited by the implicit assumption that stimulus integration occurs in a linear fashion. Here, we review a conceptual and experimental alternative that is based on exploring the space of those sensory stimuli that result in the same neural output. As demonstrated by recent results in the auditory and visual system, such iso-response stimuli can be used to identify the non-linearities relevant for stimulus integration, disentangle consecutive neural processing steps, and determine their characteristics with unprecedented precision. Automated closed-loop experiments are crucial for this advance, allowing rapid search strategies for identifying iso-response stimuli during experiments. Prime targets for the method are feed-forward neural signaling chains in sensory systems, but the method has also been successfully applied to feedback systems. Depending on the specific question, “iso-response” may refer to a predefined firing rate, single-spike probability, first-spike latency, or other output measures. Examples from different studies show that substantial progress in understanding neural dynamics and coding can be achieved once rapid online data analysis and stimulus generation, adaptive sampling, and computational modeling are tightly integrated into experiments.
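
    The closed-loop core of the method can be sketched in a few lines: for each direction in stimulus space, an automated search adjusts the stimulus amplitude until the measured response matches a predefined target. In the sketch below, a made-up model neuron stands in for the preparation and a simple bisection stands in for the adaptive search strategy; both are illustrative assumptions.

```python
# Conceptual sketch of a closed-loop iso-response search.  The model "neuron"
# and all parameters are illustrative stand-ins, not any particular experiment:
# for each stimulus direction, the amplitude is bisected until the response
# matches a predefined target firing rate, tracing out an iso-response curve.
import numpy as np

def neuron_rate(s1, s2):
    """Stand-in for the measured response: nonlinear integration of two inputs."""
    drive = (s1 ** 2 + 0.4 * s2 ** 2) ** 0.5       # hidden from the "experimenter"
    return 100.0 / (1.0 + np.exp(-(drive - 1.0) / 0.2))

def find_iso_amplitude(direction, target_rate, lo=0.0, hi=5.0, tol=1e-3):
    """Bisection on stimulus amplitude along a fixed direction (closed loop)."""
    c, s = np.cos(direction), np.sin(direction)
    while hi - lo > tol:
        a = 0.5 * (lo + hi)
        r = neuron_rate(a * c, a * s)              # "present stimulus, measure response"
        lo, hi = (a, hi) if r < target_rate else (lo, a)
    return 0.5 * (lo + hi)

target = 50.0                                       # predefined firing rate (Hz)
for phi in np.linspace(0.05, np.pi / 2 - 0.05, 5):
    a = find_iso_amplitude(phi, target)
    print(f"direction {phi:4.2f} rad -> iso-response stimulus "
          f"({a * np.cos(phi):5.2f}, {a * np.sin(phi):5.2f})")
```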

    Earthquake cycles and neural reverberations

    Driven systems of interconnected blocks with stick-slip friction capture main features of earthquake processes. The microscopic dynamics closely resemble those of spiking nerve cells. We analyze the differences in the collective behavior and introduce a class of solvable models. We prove that the models exhibit rapid phase locking, a phenomenon of particular interest to both geophysics and neurobiology. We study the dependence upon initial conditions and system parameters, and discuss implications for earthquake modeling and neural computation.
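
    The shared microscopic picture can be illustrated with a toy pulse-coupled network (all parameters are assumptions, and this is not one of the solvable models introduced in the paper): leaky units are driven toward a threshold, fire and reset when they reach it, and kick all other units, which quickly pulls the population into a common firing phase.

```python
# Toy pulse-coupled network in the spirit of the block-slider / integrate-and-
# fire analogy above.  Parameters are illustrative; the printout shows how the
# population locks into a common firing phase.
import numpy as np

rng = np.random.default_rng(1)
N, eps = 50, 0.02                          # number of units, pulse strength
I, gamma, dt, T = 1.2, 1.0, 1e-3, 30.0     # drive, leak rate, time step, duration

x = rng.uniform(0.0, 1.0, N)               # "strain" / membrane state of each unit
spike_times = [[] for _ in range(N)]

t = 0.0
while t < T:
    x += dt * (I - gamma * x)                          # slow build-up toward threshold
    fired = x >= 1.0
    fired_this_step = np.zeros(N, dtype=bool)
    while fired.any():                                 # avalanche of triggered firings
        fired_this_step |= fired
        for i in np.where(fired)[0]:
            spike_times[i].append(t)
        n_new = int(fired.sum())
        x[fired] = 0.0                                 # reset ("slip") of the fired units
        x[~fired_this_step] += eps * n_new             # pulse coupling to all other units
        fired = (x >= 1.0) & ~fired_this_step
    t += dt

first = np.array([s[0] for s in spike_times])
last = np.array([s[-1] for s in spike_times])
print("spread of first firing times:", round(float(np.ptp(first)), 3), "s")
print("spread of final firing times:", round(float(np.ptp(last)), 3), "s")
```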

    Event Timing in Associative Learning

    Associative learning relies on event timing. Fruit flies, for example, once trained with an odour that precedes an electric shock, subsequently avoid this odour (punishment learning); if, on the other hand, the odour follows the shock during training, it is approached later on (relief learning). During training, an odour-induced Ca++ signal and a shock-induced dopaminergic signal converge in the Kenyon cells, synergistically activating a Ca++-calmodulin-sensitive adenylate cyclase, which likely leads to the synaptic plasticity underlying the conditioned avoidance of the odour. In Aplysia, the effect of serotonin on the corresponding adenylate cyclase is bi-directionally modulated by Ca++, depending on the relative timing of the two inputs. Using a computational approach, we quantitatively explore this biochemical property of the adenylate cyclase and show that it can generate the effect of event timing on associative learning. We overcome the shortage of behavioural data in Aplysia and biochemical data in Drosophila by combining findings from both systems.
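
    The timing dependence discussed above can be caricatured with a generic coincidence detector: two filtered transients combined through an asymmetric timing kernel, so that the sign of the resulting associative weight depends on which signal comes first. This sketch is a deliberately simplified stand-in for the kinetic adenylate-cyclase model used in the study; kernel shape and time constants are assumptions.

```python
# Deliberately generic timing-dependence sketch (not the kinetic adenylate-
# cyclase model used in the study): an odour-driven Ca++ trace and a
# shock-driven aminergic trace are combined through an asymmetric timing
# kernel, so Ca++ before the aminergic signal gives a negative (avoidance)
# associative weight and the reverse order a positive (approach) one.
import numpy as np

dt = 0.01
t = np.arange(0.0, 60.0, dt)

def transient(onset, tau=2.0):
    """Exponentially decaying trace switched on at `onset` (seconds)."""
    return np.where(t >= onset, np.exp(-(t - onset) / tau), 0.0)

def association(odour_onset, shock_onset):
    ca = transient(odour_onset)                    # odour-induced Ca++ trace
    da = transient(shock_onset)                    # shock-induced dopaminergic trace
    lags = np.arange(-len(t) + 1, len(t)) * dt
    # asymmetric kernel: positive lag = Ca++ precedes the aminergic signal
    kernel = np.where(lags > 0, -np.exp(-lags / 5.0), np.exp(lags / 5.0))
    corr = np.correlate(da, ca, mode="full") * dt  # overlap at every relative timing
    return float(np.sum(corr * kernel) * dt)

print("odour 10 s before shock:", round(association(10.0, 20.0), 2))  # negative -> avoidance
print("odour 10 s after shock :", round(association(20.0, 10.0), 2))  # positive -> approach
```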

    Energy integration describes sound-intensity coding in an insect auditory system

    We investigate the transduction of sound stimuli into neural responses and focus on locust auditory receptor cells. As in other mechanosensory model systems, these neurons integrate acoustic inputs over a fairly broad frequency range. To test three alternative hypotheses about the nature of this spectral integration (amplitude, energy, pressure), we perform intracellular recordings while stimulating with superpositions of pure tones. On the basis of online data analysis and automatic feedback to the stimulus generator, we systematically explore regions in stimulus space that lead to the same level of neural activity. Focusing on such iso-firing-rate regions allows for a rigorous quantitative comparison of the electrophysiological data with predictions from the three hypotheses that is independent of nonlinearities induced by the spike dynamics. We find that the dependence of the firing rates of the receptors on the composition of the frequency spectrum can be well described by an energy-integrator model. This result holds at stimulus onset as well as for the steady-state response, including the case in which adaptation effects depend on the stimulus spectrum. Predictions of the model for the responses to bandpass-filtered noise stimuli are verified accurately. Together, our data suggest that the sound-intensity coding of the receptors can be understood as a three-step process, composed of a linear filter, a summation of the energy contributions in the frequency domain, and a firing-rate encoding of the resulting effective sound intensity. These findings set quantitative constraints for future biophysical models.
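
    The three-step description (linear filter, energy summation, rate encoding) can be written down compactly. In the sketch below, the tuning filter and the rate function are made-up placeholders; the point is only that every tone combination with the same summed, filtered energy is predicted to yield the same firing rate.

```python
# Sketch of the filter -> energy summation -> rate encoding description above,
# with made-up filter and rate-function parameters.  Amplitude combinations
# with equal effective intensity are predicted to lie on one iso-rate curve.
import numpy as np

def filter_gain(freq_khz, best=5.0, bw=2.0):
    """Assumed Gaussian-shaped tuning filter (amplitude gain)."""
    return np.exp(-0.5 * ((freq_khz - best) / bw) ** 2)

def effective_intensity(freqs_khz, amplitudes):
    """Energy integration: sum of squared, filtered tone amplitudes."""
    gains = filter_gain(np.asarray(freqs_khz))
    return np.sum((gains * np.asarray(amplitudes)) ** 2)

def firing_rate(effective):
    """Assumed saturating rate encoding of the effective intensity (in dB)."""
    level_db = 10.0 * np.log10(effective + 1e-12)
    return 300.0 / (1.0 + np.exp(-(level_db + 20.0) / 4.0))

freqs = [4.0, 7.0]                                 # two-tone stimulus (kHz)
target = effective_intensity(freqs, [0.3, 0.0])    # reference: tone 1 alone

# Trade amplitude of tone 1 against tone 2 while keeping the summed filtered
# energy constant: the predicted firing rate stays the same.
for a1 in [0.3, 0.25, 0.15, 0.0]:
    e1 = (filter_gain(4.0) * a1) ** 2
    a2 = np.sqrt(max(target - e1, 0.0)) / filter_gain(7.0)
    combo = effective_intensity(freqs, [a1, a2])
    print(f"A1={a1:4.2f}, A2={a2:4.2f} -> rate {firing_rate(combo):6.1f} Hz")
```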

    Timescale-invariant representation of acoustic communication signals by a bursting neuron

    Acoustic communication often involves complex sound motifs in which the relative durations of individual elements, but not their absolute durations, convey meaning. Decoding such signals requires an explicit or implicit calculation of the ratios between time intervals. Using grasshopper communication as a model, we demonstrate how this seemingly difficult computation can be solved in real time by a small set of auditory neurons. One of these cells, an ascending interneuron, generates bursts of action potentials in response to the rhythmic syllable-pause structure of grasshopper calls. Our data show that these bursts are preferentially triggered at syllable onset; the number of spikes within the burst is linearly correlated with the duration of the preceding pause. Integrating the number of spikes over a fixed time window therefore leads to a total spike count that reflects the characteristic syllable-to-pause ratio of the species while being invariant to playing back the call faster or slower. Such a timescale-invariant recognition is essential under natural conditions, because grasshoppers do not thermoregulate; the call of a sender sitting in the shade will be slower than that of a grasshopper in the sun. Our results show that timescale-invariant stimulus recognition can be implemented at the single-cell level without directly calculating the ratio between pulse and interpulse durations.
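
    The invariance argument can be checked with back-of-the-envelope arithmetic: if each burst carries a spike count proportional to the preceding pause and one burst occurs per syllable-pause period, then the spikes summed over a fixed window scale with the pause-to-period ratio and are unchanged when the whole call is stretched in time. The gain and window length in the sketch below are assumptions.

```python
# Back-of-the-envelope check of the invariance argument: spikes per burst are
# proportional to the preceding pause, one burst occurs per syllable-pause
# period, and spikes are summed over a fixed window.  Gain and window length
# are assumed for illustration.
K = 40.0           # assumed spikes per second of preceding pause
WINDOW = 1.0       # fixed integration window (s)

def spike_count(syllable, pause):
    spikes_per_burst = K * pause                      # linear in the preceding pause
    bursts_in_window = WINDOW / (syllable + pause)    # one burst per syllable onset
    return spikes_per_burst * bursts_in_window        # = K * WINDOW * pause / period

original = spike_count(syllable=0.08, pause=0.02)               # reference call
stretched = spike_count(syllable=0.08 * 1.5, pause=0.02 * 1.5)  # same call, played 1.5x slower
other_ratio = spike_count(syllable=0.08, pause=0.04)            # call with a different ratio

print("reference call       :", round(original, 1), "spikes")
print("time-stretched call  :", round(stretched, 1), "spikes (unchanged)")
print("different pause ratio:", round(other_ratio, 1), "spikes (changes)")
```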